Tight Sufficient Conditions on Exact Sparsity Pattern Recovery
Author
Abstract
A noisy underdetermined system of linear equations is considered, in which a sparse vector (a vector with only a few nonzero elements) is measured. The elements of the measurement matrix are drawn from a Gaussian distribution. We study the information-theoretic constraints on exact support recovery of the sparse vector from the measurement vector and matrix. We derive a tight sufficient condition and apply it to ergodic wide-sense stationary sparse vectors. We compare our results with existing bounds and recovery conditions. Finally, we extend our results to approximately sparse signals.

Index Terms: Sparsity pattern recovery, subset selection, underdetermined systems of equations.
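As a rough illustration of the setup described above (not the paper's analysis), the sketch below generates the noisy Gaussian measurement model and checks exact support recovery with a brute-force maximum-likelihood decoder; the dimensions, noise level, and choice of decoder are illustrative assumptions.

```python
# Sketch of the measurement model in the abstract: a k-sparse vector measured
# through an i.i.d. Gaussian matrix with additive noise, decoded by brute-force
# maximum likelihood (least-squares over all size-k supports). The sizes, noise
# level, and decoder choice are illustrative assumptions, not the paper's.
import itertools
import numpy as np

rng = np.random.default_rng(0)
p, n, k = 12, 8, 2                 # ambient dimension, measurements, sparsity
sigma = 0.1                        # noise standard deviation (assumed)

true_support = set(rng.choice(p, size=k, replace=False).tolist())
beta = np.zeros(p)
beta[list(true_support)] = rng.choice([-1.0, 1.0], size=k)

X = rng.standard_normal((n, p))                 # Gaussian measurement matrix
y = X @ beta + sigma * rng.standard_normal(n)   # noisy measurements

def ml_support(X, y, k):
    """Return the size-k support whose least-squares fit has the smallest residual."""
    best, best_res = None, np.inf
    for S in itertools.combinations(range(X.shape[1]), k):
        cols = list(S)
        coef, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
        res = np.linalg.norm(y - X[:, cols] @ coef)
        if res < best_res:
            best, best_res = set(S), res
    return best

print("exact support recovery:", ml_support(X, y, k) == true_support)
```

The exhaustive search is only feasible for tiny p; it stands in for the information-theoretically optimal decoder that the sufficient conditions concern.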
Similar references
A Sharp Sufficient Condition for Sparsity Pattern Recovery
A sufficient number of noisy linear measurements for exact and approximate sparsity pattern/support set recovery in the high-dimensional setting is derived. Although this problem has been addressed in the recent literature, there are still considerable gaps between those results and the exact limits of perfect support set recovery. To reduce this gap, in this paper, the sufficient con...
Sharp Sufficient Conditions on Exact Sparsity Pattern Recovery
Consider the n-dimensional vector y = Xβ + ε, where the unknown vector β has only k nonzero entries and ε ∈ R^n is Gaussian noise. This can be viewed as a linear system with sparsity constraints, corrupted by noise. We find a non-asymptotic upper bound on the probability that the optimal decoder for β declares a wrong sparsity pattern, given any generic perturbation matrix X. In the case when X is randomly dr...
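One standard way to make the phrase "optimal decoder" precise, stated here as an assumed formalization rather than a quotation from the paper (p denotes the length of β), is the maximum-likelihood subset selector whose error probability such a bound controls:

```latex
% Assumed formalization (requires amsmath); p is the length of beta, k its sparsity.
\[
  \widehat{S}(y, X)
  \;=\;
  \operatorname*{arg\,min}_{\substack{S \subseteq \{1,\dots,p\} \\ |S| = k}}
  \;\min_{\operatorname{supp}(b) \subseteq S}
  \lVert y - X b \rVert_2^{2},
  \qquad
  \text{error event: } \bigl\{\widehat{S}(y, X) \neq \operatorname{supp}(\beta)\bigr\}.
\]
```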
Rank-Sparsity Incoherence for Matrix Decomposition
Suppose we are given a matrix that is formed by adding an unknown sparse matrix to an unknown low-rank matrix. Our goal is to decompose the given matrix into its sparse and low-rank components. Such a problem arises in a number of applications in model and system identification, and is NP-hard in general. In this paper we consider a convex optimization formulation for splitting the specified mat...
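A minimal convex-programming sketch of this kind of sparse-plus-low-rank splitting, assuming a nuclear-norm-plus-ℓ1 objective solved with the CVXPY package; the weight gamma and the problem sizes are illustrative, not the paper's exact formulation.

```python
# Decompose M into low-rank L plus sparse S by minimizing the nuclear norm of L
# plus a weighted elementwise l1 norm of S (assumed formulation), using CVXPY.
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(0)
n = 20
L_true = rng.standard_normal((n, 2)) @ rng.standard_normal((2, n))  # rank-2 component
S_true = np.zeros((n, n))
mask = rng.random((n, n)) < 0.05                                    # ~5% sparse support
S_true[mask] = 5.0 * rng.standard_normal(mask.sum())
M = L_true + S_true                                                 # observed matrix

gamma = 0.2                                  # trade-off weight (assumed, roughly 1/sqrt(n))
L = cp.Variable((n, n))
S = cp.Variable((n, n))
objective = cp.Minimize(cp.normNuc(L) + gamma * cp.sum(cp.abs(S)))
problem = cp.Problem(objective, [L + S == M])
problem.solve()

print("estimated rank of L:", np.linalg.matrix_rank(L.value, tol=1e-2))
print("entries of S above 1e-2 in magnitude:", int((np.abs(S.value) > 1e-2).sum()))
```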
Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using ℓ1-Constrained Quadratic Programming (Lasso)
The problem of consistently estimating the sparsity pattern of a vector β based on observations contaminated by noise arises in various contexts, including signal denoising, sparse approximation, compressed sensing, and model selection. We analyze the behavior of ℓ1-constrained quadratic programming (QP), also referred to as the Lasso, for recovering the sparsity pattern. Our main result is...
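A minimal sketch of Lasso-based sparsity pattern recovery, under assumed problem sizes and a hand-picked regularization weight; scikit-learn's Lasso is used for illustration, and the paper's thresholds are not computed here.

```python
# Recover the support of a sparse beta from y = X beta + noise with the Lasso.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
p, n, k = 200, 100, 5                        # ambient dim, samples, sparsity (assumed)
support = rng.choice(p, size=k, replace=False)
beta = np.zeros(p)
beta[support] = rng.choice([-1.0, 1.0], size=k)

X = rng.standard_normal((n, p))              # Gaussian design
y = X @ beta + 0.1 * rng.standard_normal(n)  # noisy observations

# alpha is picked by hand for this sketch; the paper's interest is in how large
# n must be (relative to p and k) for such recovery to succeed.
lasso = Lasso(alpha=0.1, fit_intercept=False).fit(X, y)
estimated = set(np.flatnonzero(np.abs(lasso.coef_) > 1e-6))

print("exact sparsity pattern recovered:", estimated == set(support))
```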
Journal: CoRR
Volume: abs/1209.4209
Pages: -
Publication date: 2012